Implementing radial basis functions using bump-resistor networks - 1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Author
Abstract
Radial Basis Function (RBF) networks provide a powerful learning architecture for neural networks [6]. We have implemented an RBF network in analog VLSI using the concept of bump-resistors. A bump-resistor is a nonlinear resistor whose conductance is a Gaussian-like function of the difference of two other voltages. The width of the Gaussian basis functions may be continuously varied, so that the aggregate interpolating function ranges from a nearest-neighbor, piecewise-constant lookup to a globally smooth function. The bump-resistor methodology extends to arbitrary dimensions while still preserving the radiality of the basis functions. The feedforward network architecture needs no additional circuitry other than voltage sources and the 1D bump-resistors. A nine-transistor variation of the Delbruck bump circuit is used to compute the Gaussian-like basis functions [2]. Below threshold, the current output fits a Gaussian extremely well (see Figure 1); Figure 3 shows that the shape of the function deviates from the Gaussian shape above threshold. The width of the bump can be varied by almost an order of magnitude (see Figure 4). The Delbruck bump circuit is shown in a separate figure. A follower aggregation network, shown in Figure 5, computes an average of the input voltages ci weighted by conductance values gi [4].
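The abstract describes a normalized RBF interpolation: each stored point contributes through a Gaussian-like conductance, and the follower aggregation computes the conductance-weighted average of the stored values. A minimal numerical sketch of that computation (plain Python; the function name and example data are hypothetical, not from the paper):

```python
import math

def rbf_interpolate(x, centers, values, sigma):
    """Normalized RBF (follower-aggregation) interpolation.

    Each stored point x_i contributes a Gaussian 'conductance'
    g_i = exp(-(x - x_i)^2 / (2 sigma^2)); the output is the
    conductance-weighted average sum(g_i * c_i) / sum(g_i),
    mirroring the bump-resistor aggregation described in the abstract.
    """
    g = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in centers]
    return sum(gi * ci for gi, ci in zip(g, values)) / sum(g)

# Stored (x_i, c_i) pairs that the network would hold as voltage sources.
centers = [0.0, 1.0, 2.0]
values = [0.0, 1.0, 0.0]

# Narrow bumps: the nearest stored point dominates, giving the
# nearest-neighbor, piecewise-constant regime.
print(rbf_interpolate(0.4, centers, values, sigma=0.05))  # close to 0.0

# Wide bumps: all stored values blend, giving the globally smooth regime.
print(rbf_interpolate(0.4, centers, values, sigma=2.0))
```

Varying `sigma` here plays the role of varying the bump width in the circuit: it continuously trades off between lookup-table behavior and smooth interpolation.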
Similar Resources
Implementing Radial Basis Functions Using Bump-resistor Networks
1 Implementing Radial Basis Functions Using Bump-Resistor Networks John G. Harris University of Florida EE Dept., 436 CSE Bldg 42 Gainesville, FL 32611 [email protected] Abstract: Radial Basis Function (RBF) networks provide a powerful learning architecture for neural networks [6]. We have implemented a RBF network in analog VLSI using the concept of bump-resistors. A bump-resistor is a ...
Full Text
Convergence in neural networks with interneuronal transmission delays - 1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
A neural network is a network of interconnected elementary units which have limited characteristic properties of real (or biological) neurons. Each unit is capable of receiving many inputs, some of which can activate the unit while other inputs can inhibit its activities. The neuron-like elementary unit computes a weighted sum of the inputs it receives and fires (or produces) a sin...
Full Text
Editorial: Welcome To The IEEE Neural Networks Society
I want to welcome you to our newly formed society. On February 17, 2002, the IEEE Neural Networks Council (NNC), publisher of the IEEE TRANSACTIONS ON NEURAL NETWORKS (TNN), the IEEE TRANSACTIONS ON FUZZY SYSTEMS (TFS), and the IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION (TEC), became the IEEE Neural Networks Society (NNS). This accomplishment was made possible by the relentless efforts of our ExCo...
Full Text
Errata to "Model Transitions in Descending FLVQ"
[1] K. J. Hunt, R. Hass, and R. Murray-Smith, “Extending the functional equivalence of radial basis function networks and fuzzy inference systems,” IEEE Trans. Neural Networks, vol. 7, pp. 776–781, May 1996. [2] J.-S. R. Jang, C.-T. Sun, and E. Mizutani, Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, MATLAB Curriculum Series. Upper Saddle River, N...
Full Text
Machine Learning, Neural and Statistical Classification
Algorithms which construct classifiers from sample data, such as neural networks, radial basis functions, and decision trees, have attracted growing attention for their wide applicability. Researchers in the fields of Statistics, Artificial Intelligence, Machine Learning, Data Mining, and Pattern Recognition are continually introducing (or rediscovering) induction methods, and often publishing ...
Full Text